The first open-source, native 1-bit large language model at the 2-billion-parameter scale, developed by Microsoft Research and trained on 4 trillion tokens. It demonstrates that a natively trained 1-bit model can match the performance of comparable full-precision open-source models of the same scale while substantially improving computational efficiency.
Tags: Large Language Model · Transformers · English
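To make the "native 1-bit" idea concrete, the sketch below illustrates the general ternary (1.58-bit) weight-quantization scheme used in the BitNet family: weights are scaled by their mean absolute value, then rounded and clipped to {-1, 0, +1}. This is a minimal NumPy illustration, not the model's actual implementation; the function name and `eps` guard are assumptions.

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray, eps: float = 1e-8):
    """Sketch of absmean ternary quantization (illustrative, not the
    official implementation).

    Each weight is divided by the tensor's mean absolute value, then
    rounded and clipped to the ternary set {-1, 0, +1}. The scale is
    returned so the weights can be dequantized as w_q * scale.
    """
    scale = np.mean(np.abs(w)) + eps            # per-tensor absmean scale
    w_q = np.clip(np.round(w / scale), -1, 1)   # ternary codes {-1, 0, +1}
    return w_q.astype(np.int8), scale

w = np.array([[0.5, -1.2], [0.0, 2.0]], dtype=np.float32)
codes, scale = absmean_ternary_quantize(w)
# Every code lies in the ternary set.
assert set(codes.flatten().tolist()).issubset({-1, 0, 1})
```

Because every weight collapses to one of three values, matrix multiplication reduces to additions and subtractions with a single per-tensor scale, which is the source of the efficiency gains in memory, energy, and latency.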